Search results for "Long Short-term Memory"

Showing 5 of 5 documents

Forecasting Aquaponic Systems Behaviour With Recurrent Neural Networks Models

2022

Aquaponic systems provide a reliable way to grow vegetables while cultivating fish (or other aquatic organisms) in a controlled environment. Their main advantage over traditional soil-based agriculture and aquaculture installations is the ability to produce fish and vegetables with low water consumption. Aquaponics requires a robust control system capable of optimizing fish and plant growth while ensuring safe operation. To support the control system, this work explores the design of Deep Learning models based on Recurrent Neural Networks to forecast one hour of pH values in a small-scale industrial Aquaponics installation. This implementation guides us through the m…
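The abstract describes recurrent (GRU/LSTM) models for forecasting pH. As a minimal sketch of the recurrent building block involved — not the authors' actual architecture, with weights and hidden size chosen arbitrarily — a single LSTM cell step rolled over a toy pH sequence can be written in NumPy as:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order in the stacked weights: input, forget, output, candidate."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # stacked pre-activations, shape (4H,)
    i = 1 / (1 + np.exp(-z[0:H]))         # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:4*H])               # candidate cell state
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # new hidden state
    return h, c

# Roll the cell over a short window of hypothetical pH readings.
rng = np.random.default_rng(0)
D, H = 1, 8                               # 1 input feature (pH), hidden size 8
W = rng.normal(size=(4*H, D)) * 0.1
U = rng.normal(size=(4*H, H)) * 0.1
b = np.zeros(4*H)
h, c = np.zeros(H), np.zeros(H)
for ph in [7.1, 7.0, 6.9, 6.95]:          # toy pH sequence
    h, c = lstm_step(np.array([ph]), h, c, W, U, b)
print(h.shape)  # (8,)
```

In a trained forecaster, the final hidden state `h` would feed a dense output layer predicting the next pH values; here it only illustrates the gated state update.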

Aquaponics; Recurrent Neural Network; Gated Recurrent Unit; Data-driven Modelling; General Medicine; VDP::Teknologi: 500::Informasjons- og kommunikasjonsteknologi: 550; VDP::Landbruks- og Fiskerifag: 900::Fiskerifag: 920; Long Short-term Memory; Proceedings of the Northern Lights Deep Learning Workshop

A new method for forecasting the energy output of a large-scale solar power plant based on long short-term memory networks: a case study in Vietnam

2021

Abstract: This paper proposes a new model for short-term forecasting of the power generation capacity of a large-scale solar power plant (SPP) in Vietnam, considering fluctuations in weather factors, when applying the Long Short-Term Memory (LSTM) network algorithm. First, a configuration of the LSTM-based model is selected in accordance with the weather and operating conditions of the SPP in Vietnam. Both different LSTM structures and other conventional forecasting methods for time-series data are compared in terms of forecast error on the test data set, in order to evaluate their effectiveness and select the most suitable LSTM configuration. The most suitable config…
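The comparison described above rests on two standard preprocessing/evaluation steps: framing the generation time series as supervised (input window, next value) pairs, and scoring forecasts by an error metric such as the mean absolute percentage error listed in the subject tags. A minimal sketch of both, with toy data rather than anything from the paper:

```python
import numpy as np

def make_windows(series, lookback):
    """Frame a 1-D series as (samples, lookback) inputs and next-step targets."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = np.array(series[lookback:])
    return X, y

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

power = [10.0, 12.0, 15.0, 14.0, 13.0, 16.0]   # toy generation values (MW)
X, y = make_windows(power, lookback=3)
print(X.shape, y.shape)        # (3, 3) (3,)
print(mape(y, y))              # 0.0 for a perfect forecast
```

Each row of `X` is one lookback window and each entry of `y` the value to predict; any of the compared models (LSTM variants or conventional baselines) can then be fit on the same `(X, y)` split and ranked by MAPE on held-out test data.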

Scale (ratio); Computer science; Large scale solar power plant; 020209 energy; 020208 electrical & electronic engineering; Energy Engineering and Power Technology; 02 engineering and technology; Set (abstract data type); Mean absolute percentage error; Electricity generation; Solar power plant; Artificial Intelligence; Statistics; 0202 electrical engineering electronic engineering information engineering; Long short-term memory; Electrical and Electronic Engineering; Time series; PV power plant; Forecasting PV power; Energy (signal processing); Test data

Forecasting energy output of a solar power plant in curtailment condition based on LSTM using P/GHI coefficient and validation in training process, a…

2022

This study presents how to improve the short-term forecast of a photovoltaic plant's output power by applying Long Short-Term Memory (LSTM) neural networks to industrial-scale solar power plants in Vietnam under possible curtailment operation. Since the actual output power does not correspond to the available power under curtailment, new techniques (Global Horizontal Irradiance (GHI) interval division, and the addition of a P/GHI factor, where P is output power) have been designed and applied for handling erroneous and missing data. The prediction model (LSTM network, structure of hidden layers, number of nodes) was developed by the authors in previous work. In this new version of the model, the training technique is improve…
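The abstract's core data-cleaning idea — divide samples into GHI intervals and use the P/GHI ratio to spot records where output was held below available power — can be sketched as follows. The bin count and tolerance are hypothetical placeholders, not values from the paper, and the flagging rule is a generic heuristic standing in for the authors' actual procedure:

```python
import numpy as np

def flag_curtailed(power, ghi, n_bins=4, tol=0.7):
    """Flag samples whose P/GHI ratio falls well below the median ratio of
    their GHI interval (a hypothetical curtailment-detection heuristic)."""
    power, ghi = np.asarray(power, float), np.asarray(ghi, float)
    ratio = power / np.where(ghi > 0, ghi, np.nan)    # the P/GHI coefficient
    edges = np.linspace(ghi.min(), ghi.max(), n_bins + 1)[1:-1]
    bins = np.digitize(ghi, edges)                    # GHI interval division
    flags = np.zeros(len(power), dtype=bool)
    for b in np.unique(bins):
        sel = bins == b
        med = np.nanmedian(ratio[sel])                # typical ratio in interval
        flags[sel] = ratio[sel] < tol * med           # far below typical -> flag
    return flags

ghi   = [200, 400, 600, 800, 820]          # W/m^2
power = [0.20, 0.40, 0.60, 0.80, 0.30]     # MW; the last sample looks curtailed
print(flag_curtailed(power, ghi).tolist())
```

Flagged samples could then be corrected or excluded so that the LSTM is trained on available-power behaviour rather than curtailed output.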

Settore ING-IND/33 - Sistemi Elettrici Per L'Energia; Long short-term memory; Curtailment; Large scale solar power plant; Forecasting PV power; PV power plant; Artificial intelligence; Energy Engineering and Power Technology; Electrical and Electronic Engineering; Electric Power Systems Research

Memory degradation induced by attention in recurrent neural architectures

2022

This paper studies the memory mechanisms of recurrent neural architectures when attention models are included. Pure-attention models such as Transformers are increasingly popular, as they tend to outperform models with recurrent connections on many different tasks. Our conjecture is that attention prevents the recurrent connections from properly transferring information between consecutive steps. This conjecture is tested empirically using five different models, namely a model without attention, a standard Luong attention model, a standard Bahdanau attention model, and our proposal to add attention to the inputs in order to fill the gap between recurrent and parallel architectures (for b…
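For reference, the Luong variant mentioned in the abstract scores each encoder hidden state by a dot product with the decoder query and mixes the states with softmax weights. A minimal NumPy sketch of that mechanism (random toy states, not the paper's models):

```python
import numpy as np

def luong_dot_attention(query, keys):
    """Luong-style dot-product attention: score each encoder state against
    the decoder query, softmax over time, and return the context vector."""
    scores = keys @ query                        # (T,) one score per time step
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()
    context = weights @ keys                     # (H,) weighted sum of states
    return context, weights

rng = np.random.default_rng(1)
T, H = 5, 4                                      # 5 time steps, hidden size 4
encoder_states = rng.normal(size=(T, H))         # toy recurrent hidden states
query = rng.normal(size=H)                       # toy decoder state
context, weights = luong_dot_attention(query, encoder_states)
print(round(weights.sum(), 6))   # 1.0 -- softmax-normalized over time
```

Because the context vector bypasses the step-by-step recurrent path, it gives a concrete picture of how attention can shoulder the information transfer that the recurrent connections would otherwise perform — the interaction the paper investigates.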

attention mechanisms; recurrence; Artificial Intelligence; Cognitive Neuroscience; long short-term memory networks; gate activations; UNESCO::CIENCIAS TECNOLÓGICAS; Computer Science Applications; forget gate

Prediction of Specific TCR-Peptide Binding From Large Dictionaries of TCR-Peptide Pairs

2019

Abstract: The T cell repertoire is composed of T cell receptors (TCRs) selected by their cognate MHC-peptides and of naive TCRs that do not bind known peptides. While distinguishing a peptide-binding TCR from a naive TCR unlikely to bind any peptide can be done using sequence motifs, distinguishing between TCRs binding different peptides requires more advanced methods. Such prediction is key to using TCR repertoires as disease-specific biomarkers. Here we used large-scale TCR-peptide dictionaries together with state-of-the-art natural language processing (NLP) methods to produce ERGO (pEptide tcR matchinG predictiOn), a highly specific classifier that predicts which TCR binds to which…
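NLP-style sequence models such as the LSTM encoders tagged below consume TCR sequences one residue at a time, which requires encoding the amino-acid string numerically first. A minimal sketch of one common input encoding — plain one-hot over the 20 standard residues, an illustrative choice rather than ERGO's actual embedding:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"            # the 20 standard residues
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot_cdr3(seq, max_len=25):
    """One-hot encode a CDR3 amino-acid string, zero-padded to max_len,
    giving a (max_len, 20) matrix usable as per-step LSTM input."""
    mat = np.zeros((max_len, len(AMINO_ACIDS)))
    for t, aa in enumerate(seq[:max_len]):
        mat[t, AA_INDEX[aa]] = 1.0
    return mat

x = one_hot_cdr3("CASSLGQAYEQYF")               # an example CDR3-like sequence
print(x.shape)          # (25, 20)
print(int(x.sum()))     # 13 -- one active unit per residue
```

A binding classifier would encode the TCR and the peptide this way (or with learned embeddings), run each through a sequence encoder, and score the pair with a small feed-forward head.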

lcsh:Immunologic diseases. Allergy; Computer science; evaluation methods; T-Lymphocytes; T cell; Immunology; Receptors, Antigen, T-Cell; Epitopes, T-Lymphocyte; Target peptide; Peptide binding; Peptide; chemical and pharmacologic phenomena; Computational biology; Ligands; Software implementation; autoencoder (AE); Antigen; Evaluation methods; medicine; Immunology and Allergy; Humans; Protein Interaction Domains and Motifs; Epitope specificity; Antigens; Databases, Protein; Original Research; chemistry.chemical_classification; Binding Sites; T cell repertoire; Chemistry; Repertoire; long short-term memory (LSTM); T-cell receptor; epitope specificity; deep learning; hemic and immune systems; medicine.anatomical_structure; machine learning; Peptides; Sequence motif; lcsh:RC581-607; Software; Protein Binding; Signal Transduction; TCR repertoire analysis; Frontiers in Immunology